Up to now, we have relied on the default image compositing offered by the component. In this tutorial, we will see how to create our own composition to completely change the way our image is rendered.
Our aim will be to surround our sphere with an environment map. We will also modify the program rendering the mesh, so that the sphere reflects this environment.
First, we need to adapt what we currently have. Let's start by changing the texture :
The resource we load is now different : it is a cube map, which makes it perfect to represent an environment. From the Texture's perspective, nothing has changed, and we load it exactly the same way as before.
However, we need to interpret it differently within the HLSL program :
Let's start with the pixel stage, as the changes it requires directly impact the vertex stage.
First, the Texture2D is now a TextureCube. Cube maps have to be addressed differently within HLSL, because they are not sampled using UVs, but using a 3D direction.
As a result, we will reflect the camera direction on the sphere's surface using the normal, and sample the environment map along this resulting direction.
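To make this more concrete, here is a minimal sketch of what such a pixel stage could look like. The identifiers, register indices, and the fact that the HLSL is shown as a C++ string are assumptions for illustration, not the tutorial's exact code :

```cpp
#include <string>

// Illustrative sketch only : names and register indices are assumptions.
const std::string pixelShaderSource = R"eof(
    TextureCube texEnvironment : register(t0);
    SamplerState texSampler : register(s0);

    struct PixelInput
    {
        float4 position : SV_POSITION;
        float3 normal : NORMAL;
        float3 camDir : TEXCOORD0;
    };

    float4 pixelMain (PixelInput input) : SV_TARGET
    {
        // Reflect the camera-to-surface direction around the normal,
        // and use the resulting 3D direction to address the cube map
        float3 reflectedDir = reflect(normalize(input.camDir), normalize(input.normal));
        return texEnvironment.Sample(texSampler, reflectedDir);
    }
)eof";
```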
This is why we need to alter the vertex stage : it needs to provide the camera direction.
First, the constant buffer now features the camera position.
The vertex stage also receives the vertex normal from the mesh, instead of the texture coordinate.
This makes it possible to compute the direction from the camera to the vertex, and to feed it, along with the vertex normal, to the pixel stage.
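As a rough illustration, and under the same assumptions as above (illustrative names, positions already in world space, some transform matrix available as in the existing program), the vertex stage could be structured like this :

```cpp
#include <string>

// Illustrative sketch only : the constant buffer layout and identifiers are assumptions.
const std::string vertexShaderSource = R"eof(
    cbuffer PassConstants : register(b0)
    {
        matrix viewProj;     // assumption : a view-projection transform, as in the existing program
        float4 camPosition;  // the camera position fed by the new slot
    };

    struct VertexInput
    {
        float4 position : POSITION;
        float4 normal : NORMAL;
    };

    struct PixelInput
    {
        float4 position : SV_POSITION;
        float3 normal : NORMAL;
        float3 camDir : TEXCOORD0;
    };

    PixelInput vertexMain (VertexInput input)
    {
        PixelInput output;
        // Multiplication order depends on the matrix convention used so far
        output.position = mul(input.position, viewProj);
        output.normal = input.normal.xyz;
        // Direction going from the camera to the vertex, reflected later in the pixel stage
        output.camDir = input.position.xyz - camPosition.xyz;
        return output;
    }
)eof";
```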
And of course, we need to slightly change what the Shader feeds to the Program :
As the function name implies, we add a slot feeding the camera position during the pass, matching what the HLSL constant buffer now requires.
Launching the program in this state already enables us to witness the environment map :
However, let's face it, the sphere feels out of place. The green environment doesn't really help... How could we change that ?
The heart of image composition is the Compositor. It provides full control over the way an image is composed. One is available by default, and it is the one the rendering has been using up to now.
Let's dig into the API without further ado :
In there, we include everything we need for the compositor itself, along with the types of passes we will use. Everything will be explained as we go over the code :
First, we create the Compositor as usual, through the manager.
A Compositor is formed by one or more CompositorNodes. Those nodes represent sets of operations you want to run, and they can easily be toggled on and off.
In this case, having only one node inside will be sufficient.
The CompositorNode is composed of TargetOperations. Their aim is to specify which targets will be altered by the set of passes they are populated with.
In this case, we wish to render to the "back buffer", which is the context's surface in the window. The context also offers a dedicated depth buffer we will use.
Finally, a TargetOperations is formed of Passes of different kinds.
For our rendering, we want to clear the back and depth buffers first.
Then, we render the scene, aka our sphere, currently set within the first rendering queue.
Finally, we request a post process pass. We set it to be a back process, so that it only renders to the parts where no mesh is present, and we set the shader it will use to process the image.
As a final step, we load the compositor, so that it prepares itself and takes all our changes into account.
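To recap the structure we just described, here is a rough outline of that setup. Every method name below is a hypothetical placeholder meant to mirror the steps, not the component's actual API, so refer to the snippet above for the real calls :

```cpp
// Pseudocode-style outline : all method names are hypothetical placeholders.
void setupCompositor (nkGraphics::CompositorManager* compositorManager, nkGraphics::Shader* postProcessShader)
{
    // Create the Compositor as usual, through its manager
    nkGraphics::Compositor* compositor = compositorManager->createOrRetrieve("compositor");

    // A single node is enough to host all our operations
    nkGraphics::CompositorNode* node = compositor->addNode();

    // Target the back buffer, alongside the context's depth buffer
    nkGraphics::TargetOperations* operations = node->addOperations();
    operations->setToBackBuffer(true);
    operations->setToChainDepthBuffer(true);

    // Passes : clear the targets, render the first rendering queue, then post process the empty areas
    operations->addClearTargetsPass();
    operations->addSceneRenderPass(0);
    nkGraphics::PostProcessPass* postPass = operations->addPostProcessPass();
    postPass->setBackProcess(true);                // only touch pixels where no mesh was rendered
    postPass->setProcessShader(postProcessShader); // the shader we still have to write

    // Load the compositor so it takes all our changes into account
    compositor->load();
}
```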
However, we have a missing piece : what shader should the post process use ?
The post process pass is specific in what it does : it renders a square covering the target, making it possible to act directly on the whole bound image.
As such, the program and shader we need will have some specificities. The code we are about to see should be put before the compositor creation :
The vertex stage takes advantage of the fact that we will get a square mapped onto the screen. Each vertex will be a corner of our image.
As such, the constant buffer will expect 4 directions for the camera. Those will correspond to the directions at the 4 corners of the view.
The vertex input takes the position, and the vertex ID, which will be used to index into the camera directions array.
The vertex ID semantic, SV_VertexID, is provided to the HLSL program by DirectX. Its value is the index of the vertex being processed.
The post process mesh has been defined so that the vertex indices can directly index the array provided by the camera directions slot we will define later. In other words, the corners are defined in the same order.
The pixel input takes the final position as usual, and the camera direction for the given vertex. This enables the direction to be interpolated by the GPU between pixels.
In the function body, the position is set directly, as the square is defined in such a way that nothing specific is required. The camera direction is the array entry for the vertex index.
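Here is an illustrative sketch of what this vertex stage could look like, again with hypothetical identifiers :

```cpp
#include <string>

// Illustrative sketch only : names and register indices are assumptions.
const std::string postVertexSource = R"eof(
    cbuffer PassConstants : register(b0)
    {
        // One world space direction per corner of the view, fed by the slot defined later
        float4 camCornerDirs[4];
    };

    struct VertexInput
    {
        float4 position : POSITION;
        uint vertexId : SV_VertexID;
    };

    struct PixelInput
    {
        float4 position : SV_POSITION;
        float3 camDir : TEXCOORD0;
    };

    PixelInput vertexMain (VertexInput input)
    {
        PixelInput output;
        // The square already covers the screen, the position can be forwarded as is
        output.position = input.position;
        // Pick the corner direction matching this vertex, the GPU interpolates it per pixel
        output.camDir = camCornerDirs[input.vertexId].xyz;
        return output;
    }
)eof";
```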
The pixel stage uses the given direction to sample the environment map.
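The matching pixel stage could boil down to a single sample, still with illustrative names :

```cpp
#include <string>

// Illustrative sketch only : the texture and sampler names are assumptions.
const std::string postPixelSource = R"eof(
    TextureCube texEnvironment : register(t0);
    SamplerState texSampler : register(s0);

    struct PixelInput
    {
        float4 position : SV_POSITION;
        float3 camDir : TEXCOORD0;
    };

    float4 pixelMain (PixelInput input) : SV_TARGET
    {
        // The interpolated direction goes from the camera through the current pixel :
        // sampling the cube map along it paints the environment where no mesh was drawn
        return texEnvironment.Sample(texSampler, normalize(input.camDir));
    }
)eof";
```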
Of course, this Program needs a Shader to be used :
After the creation, we assign the program and prepare the constant buffer, texture, and sampler.
The unfamiliar bit is the slot : it will feed the camera corner directions to the program, in world space. As the camera view has 4 corners, this is exactly what the HLSL code receives, as an array of 4 float4.
Now that we have everything set, with all the required shaders and the compositor, one last step remains : we need to specify that we want the compositor to be used when rendering.
For that :
So that we can work with it :
The RenderContext can have a Compositor assigned. When rendering the context, the attached compositor will be used. If no compositor is given, the default one set within the CompositorManager is used.
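In code, this typically boils down to a single call on the context. The setter name below is only an assumption used to illustrate the idea, refer to the snippet above for the exact method :

```cpp
// Hypothetical sketch : "setCompositor" is a placeholder name, not necessarily the real API.
void useCompositor (nkGraphics::RenderContext* context, nkGraphics::Compositor* compositor)
{
    // Attach our compositor so it replaces the default one when rendering this context
    context->setCompositor(compositor);
}
```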
While it can be overridden, the default compositor in the nkGraphics component will :
This creates the rendering we had up till now. Now, however, we have altered the Compositor that should be used by the context. As such, when launching the program, we should get something different :
Now all secrets about image composition are unveiled. In short, the process is :
And with all of that, the rendering logic will be totally overridden by the behaviour specified. This concludes the tutorial !